Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes called Shannon's theorem) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
The Shannon limit or Shannon capacity of a communication channel is the theoretical maximum information transfer rate of the channel for a particular noise level.
== Overview ==

Stated by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. Shannon's theorem has wide-ranging applications in both communications and data storage. This theorem is of foundational importance to the modern field of information theory. Shannon only gave an outline of the proof. The first rigorous proof for the discrete case is due to Amiel Feinstein in 1954.
The Shannon theorem states that, given a noisy channel with channel capacity ''C'' and information transmitted at a rate ''R'', if ''R'' < ''C'' there exist codes that allow the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below the limiting rate ''C''.
The converse is also important. If R > C, an arbitrarily small probability of error is not achievable. All codes will have a probability of error greater than a certain positive minimal level, and this level increases as the rate increases. So, information cannot be guaranteed to be transmitted reliably across a channel at rates beyond the channel capacity. The theorem does not address the rare situation in which rate and capacity are equal.
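In more formal terms (a standard textbook phrasing, not part of the original excerpt), both directions can be summarized as follows, with the capacity defined as the maximum mutual information between channel input ''X'' and output ''Y'' over all input distributions:

    C = \max_{p_X} I(X; Y)

Achievability: for every rate R < C and every \varepsilon > 0, there exists, for all sufficiently large block lengths n, a code of rate at least R together with a decoder whose maximal probability of block error is at most \varepsilon. Weak converse: for codes of rate R > C, Fano's inequality gives the lower bound

    P_e^{(n)} \ge 1 - \frac{C}{R} - \frac{1}{nR},

so the error probability stays bounded away from zero as n grows.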
The channel capacity ''C'' can be calculated from the physical properties of a channel; for a band-limited channel with Gaussian noise, it is given by the Shannon–Hartley theorem.
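As a concrete illustration, the Shannon–Hartley formula C = B log2(1 + S/N) can be evaluated directly. A minimal Python sketch follows; the 3 kHz bandwidth and 30 dB SNR are assumed example values, not figures from the text:

    import math

    def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
        """Capacity in bits per second of a band-limited AWGN channel."""
        return bandwidth_hz * math.log2(1.0 + snr_linear)

    # Example: a 3 kHz telephone-like channel at 30 dB SNR (assumed values).
    snr_db = 30.0
    snr_linear = 10.0 ** (snr_db / 10.0)   # 30 dB -> a factor of 1000
    c = shannon_hartley_capacity(3000.0, snr_linear)
    print(f"Capacity ≈ {c:.0f} bit/s")     # ≈ 29,902 bit/s

No code of any complexity can convey information reliably above this rate on such a channel; below it, suitable codes can approach error-free transmission.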
Simple schemes such as "send the message 3 times and use a best-2-out-of-3 voting scheme if the copies differ" are inefficient error-correction methods, unable to asymptotically guarantee that a block of data can be communicated free of error. Advanced techniques such as Reed–Solomon codes and, more recently, low-density parity-check (LDPC) codes and turbo codes come much closer to reaching the theoretical Shannon limit, but at the cost of high computational complexity. Using these highly efficient codes and with the computing power in today's digital signal processors, it is now possible to reach very close to the Shannon limit. In fact, it was shown that LDPC codes can reach within 0.0045 dB of the Shannon limit (for binary AWGN channels, with very long block lengths).〔Sae-Young Chung, G. David Forney, Jr., Thomas J. Richardson, and Rüdiger Urbanke, "On the Design of Low-Density Parity-Check Codes within 0.0045 dB of the Shannon Limit", ''IEEE Communications Letters'', 5: 58–60, Feb. 2001. ISSN 1089-7798〕
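To make the inefficiency of repetition concrete, here is a small Python sketch (my own illustration, assuming a binary symmetric channel with crossover probability p = 0.1, not a scenario from the text). Triple repetition with majority voting lowers the bit-error probability from p to 3p²(1−p) + p³, but only at a fixed rate of 1/3 bit per channel use, and the residual error does not vanish; the BSC capacity 1 − H(p) shows that much higher rates are achievable with vanishing error:

    import math

    def binary_entropy(p: float) -> float:
        """H(p) in bits; the capacity of a BSC is 1 - H(p)."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def repetition3_error(p: float) -> float:
        """Probability that majority vote over 3 copies decodes a bit wrongly:
        at least 2 of the 3 transmissions were flipped."""
        return 3 * p**2 * (1 - p) + p**3

    p = 0.1                                  # assumed crossover probability
    print(f"raw bit error:        {p}")
    print(f"3-repetition error:   {repetition3_error(p):.4f}")   # 0.0280
    print(f"rate of the code:     {1/3:.3f} bits per channel use")
    print(f"BSC capacity 1-H(p):  {1 - binary_entropy(p):.3f}")  # 0.531

Driving the repetition error to zero requires ever more copies, pushing the rate toward zero, whereas the theorem guarantees codes with vanishing error at any fixed rate below capacity.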

Excerpt source: Wikipedia, the free encyclopedia (English edition). The full article "Noisy-channel coding theorem" is available on Wikipedia.